    Participant responses to virtual agents in immersive virtual environments.

    This thesis is concerned with interaction between people and virtual humans in the context of highly immersive virtual environments (IVEs). Empirical studies have shown that virtual humans (agents) with even minimal behavioural capabilities can have a significant emotional impact on participants of IVEs, to the extent that they have been used in studies of mental health issues such as social phobia and paranoia. This thesis focuses on understanding the impact of the behaviour of virtual humans, rather than their visual appearance, on the responses of people. Three main research questions are addressed. First, the thesis considers which key nonverbal behavioural cues are used to portray a specific psychological state. Second, it determines the extent to which the underlying state of a virtual human is recognisable through the display of a key set of cues inferred from the behaviour of real humans. Finally, it considers the degree to which a perceived psychological state in a virtual human evokes responses from participants in IVEs that are similar to those observed in the physical world. These research questions were investigated through four experiments. The first experiment focused on the impact of visual fidelity and behavioural complexity on participant responses by implementing a model of gaze behaviour in virtual humans. The study concluded that participants expected more life-like behaviours from more visually realistic virtual humans. The second experiment investigated the detrimental effects on participant responses when interacting with virtual humans with low behavioural complexity. The third experiment investigated the differences in responses of participants to virtual humans perceived to be in varying emotional states. The emotional states of the virtual humans were portrayed using postural and facial cues. Results indicated that posture does play an important role in the portrayal of affect; however, the behavioural model used in the study did not fully capture the qualities of body movement associated with the emotions studied. The final experiment focused on the portrayal of affect through the quality of body movement, such as the speed of gestures. The effectiveness of the virtual humans was gauged by exploring a variety of participant responses, including subjective responses and objective physiological and behavioural measures. The results show that participants are affected by and respond to virtual humans in a significant manner, provided that an appropriate behavioural model is used.
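
    As a purely illustrative sketch (not taken from the thesis itself), the idea of portraying affect through the quality of body movement could be expressed as a mapping from an assumed valence/arousal state to gesture playback speed; the Affect fields, value ranges and gain below are hypothetical.

        # Hypothetical toy model: scale gesture playback speed with arousal, in the
        # spirit of "portraying affect through the quality of body movement".
        # This is not the thesis's actual behavioural model.
        from dataclasses import dataclass

        @dataclass
        class Affect:
            valence: float  # assumed range -1.0 (negative) .. 1.0 (positive)
            arousal: float  # assumed range  0.0 (calm)     .. 1.0 (agitated)

        def gesture_speed_multiplier(affect: Affect, arousal_gain: float = 0.8) -> float:
            """Calm states slow gestures down; agitated states speed them up."""
            return 1.0 + arousal_gain * (affect.arousal - 0.5)

        print(gesture_speed_multiplier(Affect(valence=-0.7, arousal=0.2)))  # < 1.0: slower gestures
        print(gesture_speed_multiplier(Affect(valence=-0.8, arousal=0.9)))  # > 1.0: faster gestures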

    A Platform Independent Architecture for Virtual Characters and Avatars

    We have developed a Platform Independent Architecture for Virtual Characters and Avatars (PIAVCA), a character animation system that aims to be independent of any underlying graphics framework and so be easily portable. PIAVCA supports body animation based on a skeletal representation and facial animation based on morph targets.
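
    A minimal sketch of the layering such an architecture implies: the character's animation state (skeletal joint rotations plus morph-target weights) is kept framework-agnostic and pushed through a thin backend interface, with one backend per graphics framework. The class and method names here are assumptions for illustration, not PIAVCA's actual API.

        # Illustrative only: platform independence via an abstract rendering backend.
        from abc import ABC, abstractmethod
        from typing import Dict, Tuple

        Quaternion = Tuple[float, float, float, float]

        class RenderBackend(ABC):
            """One subclass per graphics framework the character system is ported to."""
            @abstractmethod
            def apply_joint_rotation(self, joint: str, rotation: Quaternion) -> None: ...
            @abstractmethod
            def apply_morph_weight(self, target: str, weight: float) -> None: ...

        class Character:
            def __init__(self, backend: RenderBackend) -> None:
                self.backend = backend
                self.joint_rotations: Dict[str, Quaternion] = {}  # skeletal body animation
                self.morph_weights: Dict[str, float] = {}         # morph-target facial animation

            def update(self) -> None:
                # Push the framework-agnostic pose to whichever backend is plugged in.
                for joint, rotation in self.joint_rotations.items():
                    self.backend.apply_joint_rotation(joint, rotation)
                for target, weight in self.morph_weights.items():
                    self.backend.apply_morph_weight(target, weight)

        class ConsoleBackend(RenderBackend):
            """Trivial concrete backend used here only to show the plug-in point."""
            def apply_joint_rotation(self, joint: str, rotation: Quaternion) -> None:
                print("joint", joint, rotation)
            def apply_morph_weight(self, target: str, weight: float) -> None:
                print("morph", target, weight)

        character = Character(ConsoleBackend())
        character.joint_rotations["neck"] = (0.0, 0.0, 0.0, 1.0)
        character.morph_weights["smile"] = 0.6
        character.update()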

    The Impact of a Character Posture Model on the Communication of Affect in an Immersive Virtual Environment

    Investigating the effect of relative time delay on companion screen experiences

    Mobile devices are increasingly used while watching television, leading to the development of companion apps that complement the content of programmes. A concern for these applications is the extent to which companion app and television content need to be temporally aligned for live synchronisation. In this study, 18 participants watched a nature programme while being shown companion content on a tablet. Temporal synchronisation of content between the devices was varied. Participants completed questionnaires measuring immersion and affect and were tested on their recall of companion app content. While there were no statistically significant effects on these measures, qualitative interviews with participants after viewing consistently revealed that the longer (10 s) delays in content synchronisation were frustrating. This suggests that poor content synchronisation can produce a negative companion experience for viewers and should be avoided.

    Variations in physiological responses of participants during different stages of an immersive virtual environment experiment

    This paper presents a study of the fine-grained physiological responses of participants to an immersive virtual simulation of an urban environment. Differences in participant responses at various stages of the experiment (baseline recordings, training, and the first and second halves of the urban simulation) are examined in detail. It was found that participants typically show a stress response during the training phase and again towards the end of the urban simulation. There is also some evidence that variations in the level of visual realism, based on the texture strategy used, were associated with changes in mental stress. Copyright 2006 ACM.

    Social VR: A new medium for remote communication and collaboration

    We face increasing pressure to reduce travel and work remotely, and tools that support effective remote communication and collaboration are much needed. Social Virtual Reality (VR) is an emerging medium which invites multiple users to join a collaborative virtual environment (VE) and has the potential to support remote communication in a natural and immersive way. We successfully organized a CHI 2020 Social VR workshop virtually on Mozilla Hubs, which invited researchers and practitioners to a fruitful discussion of user representations and ethics, evaluation methods, and interaction techniques for social VR as an emerging immersive remote communication tool. For this CHI 2021 virtual workshop, we would like to organize the event on Mozilla Hubs again, continuing the discussion of proxemics, social cues and VE design, which were identified as important aspects of social VR communication in our CHI 2020 workshop.

    Comparing and Evaluating Real Time Character Engines for Virtual Environments

    As animated characters become increasingly vital parts of virtual environments, the engines that drive these characters become increasingly vital parts of virtual environment software. This paper gives an overview of the state of the art in character engines and proposes a taxonomy of the features that are commonly found in them. This taxonomy can be used as a tool for comparing and evaluating different engines. To demonstrate this, we use it to compare three engines. The first is Cal3D, the most commonly used open source engine. We also introduce two engines created by the authors, Piavca and HALCA. The paper ends with a brief discussion of some other popular engines.
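
    One way to picture how such a taxonomy supports comparison is as a feature checklist that can be tabulated across engines; the feature names and the two entries below are hypothetical placeholders, not the paper's actual taxonomy categories or comparison data.

        # Illustrative only: a feature taxonomy encoded as a checklist for comparison.
        from dataclasses import dataclass, asdict
        from typing import Dict

        @dataclass
        class EngineFeatures:            # hypothetical taxonomy categories
            skeletal_animation: bool
            morph_target_faces: bool
            animation_blending: bool
            procedural_control: bool

        def comparison_table(engines: Dict[str, EngineFeatures]) -> str:
            """Render a simple yes/no table: one row per feature, one column per engine."""
            features = list(asdict(next(iter(engines.values()))).keys())
            lines = ["feature".ljust(24) + "".join(name.ljust(12) for name in engines)]
            for f in features:
                row = "".join(("yes" if asdict(e)[f] else "no").ljust(12) for e in engines.values())
                lines.append(f.ljust(24) + row)
            return "\n".join(lines)

        # Placeholder entries purely to show the mechanism:
        print(comparison_table({
            "EngineA": EngineFeatures(True, True, True, False),
            "EngineB": EngineFeatures(True, False, True, True),
        }))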

    Social VR: A new medium for remote communication and collaboration

    There is a growing need for effective remote communication, which has many positive societal impacts, such as reducing environmental pollution and travel costs and supporting rich collaboration by remotely connecting talented people. Social Virtual Reality (VR) invites multiple users to join a collaborative virtual environment, which creates new opportunities for remote communication. The goal of social VR is not to completely replicate reality, but to facilitate and extend the existing communication channels of the physical world. Alongside the benefits provided by social VR, privacy concerns and ethical risks arise when the boundary between the real and the virtual world is blurred. This workshop is intended to spur discussions regarding technology, evaluation protocols, application areas, research ethics and legal regulations for social VR as an emerging immersive remote communication tool.

    Piavca: a framework for heterogeneous interactions with virtual characters

    This paper presents a virtual character animation system for real-time multimodal interaction in an immersive virtual reality setting. Human-to-human interaction is highly multimodal, involving features such as verbal language, tone of voice, facial expression, gestures and gaze. This multimodality means that, in order to simulate social interaction, our characters must be able to handle many different types of interaction, and many different types of animation, simultaneously. Our system is based on a model of animation that represents different types of animation as instantiations of an abstract function representation. This makes it easy to combine different types of animation. It also encourages the creation of behavior out of basic building blocks, making it easy to create and configure new behaviors for novel situations. The model has been implemented in Piavca, an open source character animation system.
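
    The abstract-function view of animation can be sketched as follows: an animation is a function from time to a pose, and combinators such as blending and sequencing build new animations out of existing ones. The names and pose representation below are illustrative assumptions rather than Piavca's actual interface.

        # Illustrative only: animations as functions of time, composed by combinators.
        from typing import Callable, Dict

        Pose = Dict[str, float]               # e.g. joint or morph-target name -> value
        Animation = Callable[[float], Pose]   # time in seconds -> pose

        def blend(a: Animation, b: Animation, w: float) -> Animation:
            """Linear blend of two animations with weight w in [0, 1]."""
            def anim(t: float) -> Pose:
                pa, pb = a(t), b(t)
                return {k: (1 - w) * pa.get(k, 0.0) + w * pb.get(k, 0.0)
                        for k in set(pa) | set(pb)}
            return anim

        def sequence(a: Animation, b: Animation, switch_time: float) -> Animation:
            """Play animation a until switch_time, then animation b."""
            return lambda t: a(t) if t < switch_time else b(t - switch_time)

        # Two toy 'animations' (constant poses) combined into a new behaviour:
        nod = lambda t: {"head_pitch": 0.2}
        smile = lambda t: {"smile_morph": 1.0}
        greeting = sequence(blend(nod, smile, 0.5), smile, switch_time=1.0)
        print(greeting(0.5), greeting(1.5))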

    Talk to the Virtual Hands: Self-Animated Avatars Improve Communication in Head-Mounted Display Virtual Environments

    Background: When we talk to one another face-to-face, body gestures accompany our speech. Motion tracking technology enables us to include body gestures in avatar-mediated communication, by mapping one's movements onto one's own 3D avatar in real time, so the avatar is self-animated. We conducted two experiments to investigate (a) whether head-mounted display virtual reality is useful for researching the influence of body gestures in communication; and (b) whether body gestures are used to help in communicating the meaning of a word. Participants worked in pairs and played a communication game, where one person had to describe the meanings of words to the other.
    Principal Findings: In experiment 1, participants used significantly more hand gestures and successfully described significantly more words when nonverbal communication was available to both participants (i.e. both describing and guessing avatars were self-animated, compared with both avatars in a static neutral pose). Participants ‘passed’ (gave up describing) significantly more words when they were talking to a static avatar (no nonverbal feedback available). In experiment 2, participants' performance was significantly worse when they were talking to an avatar with a prerecorded listening animation, compared with an avatar animated by their partner's real movements. In both experiments participants used significantly more hand gestures when they played the game in the real world.
    Conclusions: Taken together, the studies show how (a) virtual reality can be used to systematically study the influence of body gestures; (b) it is important that nonverbal communication is bidirectional (real nonverbal feedback in addition to nonverbal communication from the describing participant); and (c) there are differences in the amount of body gestures that participants use with and without the head-mounted display, and we discuss possible explanations for this and ideas for future investigation.
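
    A minimal sketch of the self-animated avatar idea described above: tracked joint data is copied onto the user's own avatar every frame, so gestures appear in real time. The tracker and avatar interfaces here are assumptions for illustration, not the study's actual system.

        # Illustrative only: per-frame mapping of tracked motion onto a self-animated avatar.
        from typing import Dict, Tuple

        Quaternion = Tuple[float, float, float, float]

        class MotionTracker:
            """Stand-in for a real tracking SDK; returns one joint for illustration."""
            def read_joint_rotations(self) -> Dict[str, Quaternion]:
                return {"right_elbow": (0.0, 0.0, 0.38, 0.92)}

        class Avatar:
            def __init__(self) -> None:
                self.joints: Dict[str, Quaternion] = {}

            def set_joint(self, name: str, rotation: Quaternion) -> None:
                self.joints[name] = rotation

        def animate_self_avatar(tracker: MotionTracker, avatar: Avatar) -> None:
            """Copy each tracked joint rotation onto the avatar skeleton for this frame."""
            for joint, rotation in tracker.read_joint_rotations().items():
                avatar.set_joint(joint, rotation)

        avatar = Avatar()
        animate_self_avatar(MotionTracker(), avatar)
        print(avatar.joints)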